allow symlinking to shell scripts #2386
Conversation
Can one of the admins verify this patch?
Again, still looks like a duplicate of #1875
bin/spark-shell (Outdated)
You may have to quote these so that dirs with spaces in their names keep working, like they do at the moment.
The above should look like:
FWDIR="$(cd "$(dirname "$(readlink -f "$0")")"/..; pwd)"
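A minimal illustration of why the quoting matters, using a hypothetical install path containing a space (not taken from the PR itself):
# Hypothetical layout: /opt/spark dir/bin/spark-shell  ("spark dir" contains a space)
# Fully quoted, the whole path reaches cd as a single word:
FWDIR="$(cd "$(dirname "$(readlink -f "$0")")"/..; pwd)"
# Unquoted, word splitting breaks the path, e.g.:
#   cd $(dirname $(readlink -f $0))/..   ->  bash: cd: too many arguments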
Of course, this applies to all the other places as well.
Note that this PR won't support a symlink to a symlink to Spark (i.e., multiple redirections). The PR I submitted at some point (#1875) shows how to implement this properly with a loop.
Thanks for working on this; however, since it's a duplicate, I think we should probably close this issue and continue any discussion on #1875.
This PR is based on the work of roji to support running Spark scripts from symlinks. Thanks for the great work, roji. Would you mind taking a look at this PR? Thanks a lot. For distributions like HDP and others, the Spark executables are normally exposed as symlinks placed on the `PATH`, but the current Spark scripts do not resolve the real path from a symlink recursively, so Spark fails to execute when invoked through a symlink. This PR tries to solve that by finding the absolute path behind the symlink. Instead of using `readlink -f` as this PR (apache/spark#2386) did, the path is resolved manually in a loop, because `-f` is not supported on Mac. I've tested on Mac and Linux (CentOS), and it looks fine. This PR did not touch the scripts under the `sbin` folder; not sure if those need to be fixed as well? Please help to review; any comment is greatly appreciated. Author: jerryshao <[email protected]> Author: Shay Rojansky <[email protected]> Closes apache#8669 from jerryshao/SPARK-2960.
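A minimal sketch of the loop-based resolution described above, assuming `readlink -f` is unavailable (as on Mac); this is only an illustration, not the exact code merged in apache#8669:
# Follow "$0" through a chain of symlinks without relying on readlink -f.
SOURCE="$0"
while [ -h "$SOURCE" ]; do
  DIR="$(cd -P "$(dirname "$SOURCE")" && pwd)"
  SOURCE="$(readlink "$SOURCE")"
  # A relative link target is resolved against the directory containing the link.
  case "$SOURCE" in
    /*) ;;
    *) SOURCE="$DIR/$SOURCE" ;;
  esac
done
# SOURCE now points at the real script; Spark's home is its parent's parent.
FWDIR="$(cd -P "$(dirname "$SOURCE")/.." && pwd)"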
patch for SPARK-3482